Information-theoretic lower bounds on the oracle complexity of convex optimization
Authors
Abstract
Relative to the large literature on upper bounds for the complexity of convex optimization, less attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, gaining an understanding of these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. We improve upon known results and obtain tight minimax complexity estimates for various function classes.
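As a point of reference, the minimax quantity studied in this line of work can be sketched as follows (the notation below is illustrative, not taken from the paper: $\mathcal{F}$ is the function class, $\mathcal{S}$ the feasible set, and $x_T^M$ the point returned by method $M$ after $T$ oracle queries):

```latex
\epsilon^*(\mathcal{F}, T)
  \;=\;
  \inf_{M}\;
  \sup_{f \in \mathcal{F}}\;
  \mathbb{E}\Bigl[\, f\bigl(x_T^M\bigr) \;-\; \inf_{x \in \mathcal{S}} f(x) \,\Bigr],
```

where the infimum ranges over all methods interacting with the (stochastic) oracle for $T$ rounds, and the expectation is over the oracle's randomness. Upper bounds exhibit a method achieving a given $\epsilon^*$; the lower bounds in this paper show no method can do better over the class $\mathcal{F}$.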
Similar resources
Information-Theoretic Lower Bounds on the Oracle Complexity of Sparse Convex Optimization
Relative to the large literature on upper bounds for the complexity of convex optimization, less attention has been paid to the fundamental hardness of these problems. Recent years have seen a surge in optimization methods tailored to sparse optimization problems. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation, when the objective is optim...
Full text
On Convex Optimization, Fat Shattering and Learning
We consider the oracle complexity of the problem under the oracle-based optimization model introduced by Nemirovski & Yudin (1978). We show that the oracle complexity can be lower bounded by the fat-shattering dimension introduced by Kearns & Schapire (1990), a key tool in learning theory. Using this result, we proceed to establish upper bounds on learning rates for agnostic PAC learning with linear ...
Full text
Oracle Complexity of Second-Order Methods for Smooth Convex Optimization
Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive (to the best of our knowledge) the firs...
Full text
Information-theoretic lower bounds for convex optimization with erroneous oracles
We consider the problem of optimizing convex and concave functions with access to an erroneous zeroth-order oracle. In particular, for a given function x → f(x) we consider optimization when one is given access to absolute error oracles that return values in [f(x) − ε, f(x) + ε] or relative error oracles that return values in [(1 − ε)f(x), (1 + ε)f(x)], for some ε > 0. We show stark information theoret...
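The two oracle models described above are easy to state concretely. The sketch below (our own illustration, not code from the paper; function and parameter names are hypothetical) wraps a function f with an absolute-error and a relative-error zeroth-order oracle:

```python
import random

def absolute_error_oracle(f, eps):
    """Zeroth-order oracle: each query returns f(x) plus an
    adversarial-style additive perturbation in [-eps, eps]."""
    def oracle(x):
        return f(x) + random.uniform(-eps, eps)
    return oracle

def relative_error_oracle(f, eps):
    """Zeroth-order oracle: each query returns f(x) scaled by a
    multiplicative factor in [1 - eps, 1 + eps]."""
    def oracle(x):
        return f(x) * random.uniform(1 - eps, 1 + eps)
    return oracle

# Querying both oracles on the convex function f(x) = x^2 at x = 2,
# where f(2) = 4, so the returned values lie in the stated intervals.
f = lambda x: x * x
v_abs = absolute_error_oracle(f, 0.1)(2.0)  # lies in [3.9, 4.1]
v_rel = relative_error_oracle(f, 0.1)(2.0)  # lies in [3.6, 4.4]
```

The point of the lower bounds is that even this bounded corruption suffices to hide the minimizer: two convex functions whose values differ by less than ε everywhere can have far-apart minima, so no number of queries distinguishes them.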
Full text
On Lower Complexity Bounds for Large-Scale Smooth Convex Optimization
We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing smooth convex functions (with gradient that is Hölder continuous with a given exponent and constant) over high-dimensional ‖ · ‖p-balls, 1 ≤ p ≤ ∞. Our bounds turn out to be tight (up to factors logarithmic in the design dimension), and can be viewed as a substanti...
Full text